BUILD A BETTER BRAIN
In one corner stands a stack of magazines. In
another sits the Sunday paper. On the counter, the radio crackles with
news, while nearby the fax machine hums. The information age is
definitely upon us.
If you're like most, you're still reeling,
struggling to take it all in, perhaps shutting down your input channels
entirely or jettisoning subscriptions simply to survive. And it's not
going to get any better. The outlook report from dataland is bleak:
Every five years, the information load is doubling.
There's nothing left to do but hope for a bigger, brighter brain.
What the data doctors can't even hope to
promise, science may yet deliver. In this, the Decade of the Brain,
researchers are hot on the trail of how we acquire and store
information. Merging psychology with biology, they have made a series
of recent discoveries that appear to catch learning in the act.
Neuroscientists plumbing this virgin terrain
now know that, along with genetic inheritance, experience shapes the
very structure of our nervous systems-it alters the brain circuits that
process everything from a French lesson to an auto repair guide.
Learning, the process by which we acquire information about our world,
may actually change our brains for the better. Animal research suggests
that the more we use our brains, the more efficient our intellectual
muscle gets.
Taken together, this work demonstrates that the brain is an extraordinarily plastic organ, responding actively to a novel environment by growing new connections to greet it. Although the brain is unlike any other organ in that its cell bodies cannot renew themselves, nerve cells do generate new connections, or
synapses-the points at which signals are transmitted-forging new and
enhanced pathways for the flow of information.
These findings suggest you can essentially
train your brain to collate more information faster and retrieve it more readily. And under the right conditions of stimulation, you
can grow yourself a brain that will keep up with your information
needs-perhaps even exceed them.
Nature does set certain parameters. We all
start out with about the same number of neurons, or brain cells, having
the same basic structure. By nine months of age, our nerve cells stop dividing, leaving each of us with about 100 billion to a trillion of them.
By far the most sophisticated thinking
machine known to man, the adult brain massively outperforms today's
best supercomputer. It processes billions of operations a
second-approximately 10^15, versus a mere 10^9 for the machine-all in
three pounds of tissue crammed inside the cranium. So densely packed is
the brain that a sample no larger than a grain of rice contains one
million neurons, 20 miles of axon-or the extension cords of nerve
cells-and 10 billion synapses, calculates California neurobiologist
Charles Stevens, Ph.D.
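For readers who like to check the arithmetic, a rough consistency test of these figures follows; the closing assumption that each synapse handles on the order of one operation per second is mine, not the article's.

```python
# Back-of-envelope check (not from the article): do the density figures above
# square with the 10^15 operations-per-second estimate?

neurons_per_sample  = 1_000_000        # grain-of-rice sample (article's figure)
synapses_per_sample = 10_000_000_000   # same sample (article's figure)
neurons_in_brain    = 100_000_000_000  # low end of the article's 100 billion to a trillion

synapses_per_neuron = synapses_per_sample / neurons_per_sample   # roughly 10,000
synapses_in_brain   = neurons_in_brain * synapses_per_neuron     # roughly 1e15

print(f"synapses per neuron: {synapses_per_neuron:,.0f}")
print(f"synapses in the brain: {synapses_in_brain:.0e}")
# If each synapse handled on the order of one operation per second (my assumption),
# the brain would land near 10^15 operations per second, the figure quoted above.
```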
The vast majority of these neurons are contained in
the cerebral cortex, or neocortex, the most recently evolved part of
the brain-a highly corrugated sheet of gray matter less than three
eighths of an inch deep that overlies most other brain structures. The
cortex accounts for 80 percent of the brain's total volume and contains the equipment responsible for sensation, thought, imagery, language,
and other distinctively human abilities. Here, with the assistance of
other brain structures, is where the brain makes sense of received
stimuli, piecing together the signals from various sensory pathways,
connecting them and interconnecting them, and converting them into felt
experience.
Formerly the domain of philosophers, this
once-ethereal territory has only recently been opened for scientific
exploration. Using such technological advances as electrodes, gels,
high-powered microscopes and imaging devices including positron
emission tomography, or PET scans (think of them as maps of energy
flow), along with such low-tech equipment as sea slugs and rat brains,
neuroscientists are providing an unprecedented understanding of our
brains.
"Stimulation in general is very important to
the development of the brain," reports neurobiologist Carla Shatz,
Ph.D., of the University of California at Berkeley. While evolution has
programmed us to perform certain basic tasks necessary to sustain
life-such as eating and sleeping-we still have to learn how to do
almost everything else.
Researchers believe that, from birth to
adolescence, we are laying down the basic circuitry of the brain. As we
grow up, the world makes its mark physically: exposure to novel tasks and novel stimuli spurs the growth of new circuits and synapses to handle them. From then on, continued
stimulation throughout life further strengthens these pathways and
enhances their interconnections.
Scientists cannot yet quantify exactly how
much an enriched environment helps the brains of young children to
grow. But "we do know that deprivation and isolation can result in
failure of the brain to form its rich set of connections," says Shatz.
Whether it's a new sensation or a fresh idea,
every outside stimulus is first converted into electrical signals as it
enters the cranium. These electrical signals trundle down known
pathways, splitting off into multiple directions for processing. Where
the lack of prior experience has left no established route, the signal
will forge a new one, linking neuron to neuron as it travels along. The
resulting chain is called a brain circuit, and the next time the same
stimulus enters the brain, it speeds efficiently along its old route,
now grooved into an expressway. Hundreds of millions of brain circuits
are created by millions of experiences.
Sometime near the high-school prom age-around
18 years old-networks stop forming. We are hard-wired by the end of
adolescence. Each of us is left with a "brainprint," or network system, which, like a fingerprint, is unique to its owner. This is the
hardware that processes our thoughts.
Once it's in place, certain opportunities are
no longer available to us. If, for example, we learn a second language
after adolescence, it comes out sounding something like the first one.
"We are incapable of acquiring new languages without an accent after
adolescence," reports Mark Konishi, a biologist at California Institute
of Technology who studies bird-song development. The connections that process sound form during development. But once our brains are so shaped,
Konishi says, "we probably use the same neural substrate to process new
sounds."
However, the brain is an enormously adaptive
organ: The connections between neurons proliferate and shrink depending
upon use. The links between them can be strengthened or weakened.
"Brain networks can always be fine-tuned," says neurobiologist Stevens,
of the Howard Hughes Medical Institute at the Salk Institute in La Jolla.
The more synapses between cells, the more avenues for information
transmission. The better your cells communicate with one another, the
more information you can likely digest, understand and recall
efficiently.
"Smarter" people-those who can consume and
regurgitate facts with the efficiency of machines-may in fact have a
greater number of neural networks more intricately woven together. And
recall of any one part seems to summon up a whole web of information.
New pictures of the brain in action confirm
this model of efficiency of information flow. Last spring, researchers
scanning human brains by positron emission tomography (PET)-which
highlights the regions that work hardest during various tasks-found
that "smarter" brains consume less energy than other brains; to do the
same tasks they require less glucose, their favored fuel. "It maybe
that once the brain becomes really well grooved you don't need as much
energy," explains Eric Kandel, M.D., a neurobiologist at the Howard
Hughes Medical Institute at Columbia University in New York.
Perhaps that explains why rats raised in
enriched environments later learn faster than counterparts kept in
barren cages. And perhaps it will help researchers to understand a
recent controversial study showing a significant correlation between
low levels of education and the incidence of Alzheimer's disease.
According to neuroscientist Robert Katzman, Ph.D., of the University of
California at San Diego, individuals who lack formal education may
develop fewer synapses, or junctures between neurons, than individuals
who have routinely stretched their minds. Then, when disease occurs,
there is less brain reserve to call on, he says. When Alzheimer's
disease strikes them, the loss of synapses is dramatic and quick to
show.
Katzman hopes to directly investigate whether
the number of synapses in uneducated people is actually different from
that of educated people. In the meantime, neurobiologist Richard
Mayeux, Ph.D., of Columbia University, appears to have confirmed part
of what Katzman is getting at. He has shown that people with high
IQs can withstand more brain scarring than less gifted people before
they show a noticeable loss of intellect.
So managing large amounts of information
throughout your life-as well as keeping your mind active into old
age-doesn't just make you smarter, it also appears to buy you some time
should you be stricken with a degenerative brain disease. And it can
also help you withstand the more everyday ravages of age.
What makes it possible to change our brains
to work faster and smarter? Human studies show that there are two kinds
of learning. Declarative-or factual-learning consists of the
acquisition of details about people, places, and things; it is presumed
to be highly associative, drawing on rich neural interconnections.
Procedural learning, on the other hand, involves knowing how to do things that draw on motor skills and perceptual strategies, such as driving a car.
Each relies on different neural systems in
the brain. Procedural learning, more narrow-channel, involves the
specific sensory and motor systems underlying the particular skill.
Declarative learning is processed in the hippocampus-a small,
seahorse-shaped structure located at the base of the temporal lobe of
the cerebral cortex. Not only is the hippocampus central to the
formation and retrieval of lasting memories, it is part of the limbic
system, or emotional brain.
The still-unfolding story of the hippocampus and its role in the flow of information began with amnesiacs and is now spun increasingly from nonhuman sources.
Sophisticated as the new imaging technology is, it does not go far
enough to pin down the complex doings of the human brain at work. For
this, scientists have turned to the simpler systems of the sea snail
Aplysia, and to rats and cats, among other creatures.
The bet is that the basic cellular processes
in these neurons are similar to ours; that the same kinds of changes
that animate the brains of "lower" animals animate ours as well. Even
as this article goes to press, the various models are producing vast
amounts of information that provide an increasingly complete idea of
how brains take in the newspaper stories, the lectures, the exhibits, the
news, and the noise we want to remember.
At the start of this chain, the sensory
organs-eyes, ears, nose, mouth, fingertips-transform stimuli into
rhythmic patterns of electrical impulses. Then, one by one, millions of
neurons pass the charge on to their neighbors. This process is
accomplished by chemical as well as electrical means.
Picture a nerve cell. Extending out from the
cell body in one direction is the axon, or output arm. Shorter
receiving cables, or dendrites, stem from other parts of the cell. The
ultrathin fibers of axon and dendrite terminate in tiny branches.
Between the axonal branches of one cell and the dendrite ends of the
next is an infinitesimal space-the synapse-which is the site of
communication between two neurons.
When an electric charge is sent from one cell
to the next, it is ferried across the synaptic gap with the help of
specialized chemicals known as neurotransmitters. The neurotransmitter
influences the electrical conditions at the synapse, and the receiving
neuron fires if it collects enough charge, carrying the starting
stimulus to the next cell down the line.
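To make that threshold idea concrete, here is a deliberately simple sketch, in Python, with made-up numbers; it illustrates the principle, not a real neuron.

```python
# Toy sketch (not from the article): a neuron "fires" only when the charge it
# collects from incoming signals crosses a threshold. The numbers are arbitrary
# illustration values, not physiological measurements.

FIRING_THRESHOLD = 1.0  # arbitrary units of collected charge

def neuron_fires(incoming_signals, threshold=FIRING_THRESHOLD):
    """Return True if the summed incoming charge reaches the threshold."""
    return sum(incoming_signals) >= threshold

print(neuron_fires([0.2, 0.3]))        # a few weak inputs: the cell stays quiet (False)
print(neuron_fires([0.2, 0.3, 0.6]))   # enough collected charge: the cell fires (True)
```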
A single neuron can send and receive
thousands of signals a second. All this brain noise produces a
biological translation of the words that you just read. How this
message is eventually stored, or retained, is less clear. Most
neurobiologists suggest that memory involves some kind of sustained
changes in the neurons and their connections-perhaps similar to those
that occur during information acquisition. The cells that respond when
you recall Hamlet's Act III soliloquy, for instance, may be the same
ones that were throbbing when you were taking it in originally.
It is now widely accepted that memory is not
stored in a single cell but is spread out over an extensive neuronal
network. Each cell provides a tiny piece of a complex mosaic. "Even the
simplest memory is spread out over millions of neurons," Stevens says.
Along the same lines, memory recall appears
to involve multiple parts of the brain. The most convincing evidence
comes from a recent study by neuroscientists Marcus Raichle, Ph.D., of
Washington University in St. Louis, and San Diego's Larry Squire. The
two peered by PET scan into the brains of a group of subjects asked to
remember specific words. As the subjects reached back into their
memory, their brain images flashed all over with light, a sign many
sites were participating in the process. "Memory," says Raichle, "is
like a piece of music-it has lots of different parts that come together to create the whole."
Further, "we appear to have specialized
processing centers that act in different combinations when we recognize
something. We know that the hippocampus plays a critical role in laying
down new memories and recalling the recent past." Raichle and others
think that memory formed in the hippocampus gets stored in neighboring
neocortex, a setup that frees the hippocampus for new tasks. No one,
however, knows for sure how a short-term memory, which lasts for a
couple of days at best, turns into long-term memory that can last a
lifetime.
It is a problem Columbia University's Eric
Kandel has been working on for three decades. In his painstaking
investigations into how experience changes the nervous system, or the
cellular and molecular mechanisms of learning and memory, he has
focused on the simple nervous system of the sea snail Aplysia. Its
20,000 neurons are the largest in the animal kingdom. For this, Kandel is considered by some to be the most reductionist scientist of our time.
Along with magnificently accessible nerve
cells, Aplysia also has an easily observable behavior, the "gill
withdrawal" reflex. Tap on Aplysia's spout, or siphon, and the snail
withdraws its gill. Kandel found that if he shocked the snail's tail,
the creature became "sensitized." It learned to overreact, to instantly
withdraw its gill upon the slightest touch. (A basic form of learning,
sensitization takes place in humans as well.)
Once Kandel identified the key cells that
contribute to this type of learned behavior, he then looked for changes
that, with training, occurred within the cells. He and his colleagues
found that a single tail shock-which produces short-term memory for
sensitization-activates a cascade of cellular events in which the
sensory neuron releases more neurotransmitter, strengthening the connections between the sensory neuron from the siphon and the motor neuron for the gill. As a result, the communication
between the sensory and motor cells becomes more efficient.
If not reinforced, this activity is
transient, and the increase in strength of the connections lasts only
minutes. However, when the tail shocks are repeated at least four or five times, long-term memory forms as the training prods the synthesis of new proteins within the nerve cells. Under these conditions, Kandel
finds, the sensory neuron actually undergoes an anatomical change. The
neurotransmitter acts as a growth factor; there is a doubling of the
number of synaptic connections the sensory cells make onto the motor
cells. Now the cell is altered for a period of weeks so that it can
send messages more effectively than before, thereby enhancing
information processing within the brain.
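As a rough picture of that pattern, the sketch below (my own construction, not Kandel's model) uses only the number of shocks as input; the four-to-five-shock cutoff and the doubling of connections follow the article's figures, and everything else is invented.

```python
# Toy sketch: a single shock strengthens the connection only briefly, while
# repeated shocks trigger a lasting change, modeled here as a doubling of the
# synaptic connections. The function and its cutoff are illustrative only.

def synapse_after_training(num_shocks, base_connections=1):
    """Return (number of connections, how long the change lasts)."""
    if num_shocks == 0:
        return base_connections, "baseline"
    if num_shocks < 4:
        # Short-term memory: more transmitter is released, but the boost fades.
        return base_connections, "minutes (transient strengthening)"
    # Long-term memory: new proteins are synthesized and the sensory cell roughly
    # doubles its connections onto the motor cell, a change lasting for weeks.
    return base_connections * 2, "weeks (anatomical change)"

for shocks in (1, 5):
    print(shocks, "shock(s):", synapse_after_training(shocks))
```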
That this phenomenon applies to you and me is
becoming increasingly clear. What Kandel has observed of Aplysia,
others have espied in mammals, including rats. University of Illinois
neuroscientist William Greenough, Ph.D., for one, has found that
neocortex neurons of rats reared in complex environments, and trained
in a maze every day, had more extensive dendrites than did comparison
animals. Their dendrites also sprouted more synapses. So experience
seems to change the brains of rats much as it does those of sea snails.
Other researchers have recently focused
increasing attention on another phenomenon, called long-term
potentiation (LTP), that also seems to be a component of associative
memory formation. To elicit the LTP response, researchers stick a probe
into one section of hippocampal tissue and stimulate it briefly but
intensely with electricity.
Later they stimulate the tissue with less
shock-but communication across the synapse is found to be stronger.
Moreover, the effect persists for days, sometimes weeks. This,
researchers believe, "looks an awful lot like learning," a case of
neuronal plasticity with an increase in synaptic response-in other
words, the creation of new channels that increase the efficiency of
information processing.
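The same kind of stripped-down sketch can capture the LTP protocol itself, with arbitrary numbers chosen only to show the before-and-after pattern:

```python
# Toy sketch (not a model from the researchers cited): a brief, intense "tetanus"
# raises the synaptic weight; afterward the same weak test pulse produces a
# larger response, and the boost persists. All values are arbitrary.

synaptic_weight = 1.0   # arbitrary baseline strength
TEST_PULSE = 0.5        # the same weak stimulus, applied before and after

def response(stimulus, weight):
    """Postsynaptic response to a stimulus at the current synaptic strength."""
    return stimulus * weight

before = response(TEST_PULSE, synaptic_weight)

# Brief but intense stimulation potentiates the synapse (here, a 50 percent boost).
synaptic_weight *= 1.5

after = response(TEST_PULSE, synaptic_weight)
print(f"response before tetanus: {before:.2f}, after: {after:.2f}")
# The weaker later stimulation now evokes a stronger response across the synapse,
# the signature the article says "looks an awful lot like learning."
```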
Scientists want to know precisely what
changes take place in LTP, whether the molecular changes associated
with LTP occur primarily in the receiver cell, the transmitting nerve
cell, or in both. A variety of mechanisms affect synaptic strength.
Perhaps various combinations of these determine how different forms of
learning occur-for example, how facts are acquired versus how skills
are retained.
Still, enough of the evidence is in to walk
away with growing certainty. Simply making the attempt to keep up
appears to stretch and strengthen our minds physically. And it may give
us an edge against degenerative disease. We may survive the information
age after all. And exit it in better shape than it found us.
By Beth Livermore